15 research outputs found

    Litigating Presidential Signing Statements

    In response to President George W. Bush's aggressive use of presidential signing statements, several members of Congress as well as a prominent Task Force of the American Bar Association have proposed legislation to provide for judicial review of signing statements. These critics assert that the President must veto legislation with which he disagrees, rather than use signing statements to refuse to enforce statutes that he signs into law. This article explores whether Congress can litigate presidential signing statements, concluding that they are not justiciable even if Congress enacts a law granting itself standing to raise such a challenge. Congress might be able to piggyback on litigation brought by private parties through the procedural tools of intervention and amicus. However, private parties may also be hard-pressed to challenge signing statements, even if the President follows through on the views expressed in his signing statements and declines to enforce the laws as written. As a result, Congress must exercise its political powers if it wishes to confront the President over his signing statements.

    The Poverty Defense


    Expanding Civil Rights to Combat Digital Discrimination on the Basis of Poverty

    Low-income people suffer from digital discrimination on the basis of their socio-economic status. Automated decision-making systems, often powered by machine learning and artificial intelligence, shape the opportunities of those experiencing poverty because they serve as gatekeepers to the necessities of modern life. Yet in the existing legal regime, it is perfectly legal to discriminate against people because they are poor. Poverty is not a protected characteristic, unlike race, gender, disability, religion, or certain other identities. This lack of legal protection has accelerated digital discrimination against the poor, fueled by the scope, speed, and scale of big data networks. This Article highlights four areas where data-centric technologies adversely impact low-income people by excluding them from opportunities or targeting them for exploitation: tenant screening, credit scoring, higher education, and targeted advertising. Currently, there are numerous proposals to combat algorithmic bias by updating analog-era civil rights laws for our datafied society, as well as to bolster civil rights within comprehensive data privacy protections and algorithmic accountability standards. On this precipice for legislative reform, it is time to include socio-economic status as a protected characteristic in antidiscrimination laws for the digital age. This Article explains how protecting low-income people within emerging legal frameworks would provide a valuable counterweight against opaque and unaccountable digital discrimination, which undermines any vision of economic justice.

    Feminism, Democracy and the War on Women


    The Class Differential in Privacy Law

    This article analyzes how privacy law fails the poor. Due to advanced technologies, all Americans are facing corporate and governmental surveillance. However, privacy law is focused on middle-class concerns about limiting the disclosure of personal data so that it is not misused. By contrast, along the welfare-to-work continuum, poor people face privacy intrusions at the time that the state or their employers gather data. This data collection tends to be stigmatizing and humiliating, and it thus not only compounds the harmful effects of living in poverty, but also dampens democratic participation by the poor. The poor interact with the government and low-wage employers in ways that are on-going and interpersonal, and as a result, the right to be left alone embodied in current privacy law does not protect their interests in dignity and autonomy. This article argues that poor Americans experience privacy differently than persons with greater economic resources and that the law, in its constitutional, statutory, and common law dimensions, reinforces this differential. This class differential in privacy law has costs not only for the poor, but for all citizens.

    Beyond Window Dressing: Public Participation for Marginalized Communities in the Datafied Society

    We live in a datafied society in which our personal data is being constantly harvested, analyzed, and sold by public and private entities, and yet we have little control over our data and little voice in how it is used. In light of the impacts of algorithmic decision-making systems—including those that run on machine learning and artificial intelligence—there are increasing calls to integrate public participation into the adoption, design, and oversight of these tech tools. Stakeholder input is particularly crucial for members of marginalized groups, who bear the disproportionate harms of data-centric technologies. Yet, recent calls for public participation have been mostly hortatory and without specific strategies or realistic recommendations. As this Article explains, policy makers need not operate from a blank slate. For decades, a variety of American statutory regimes have mandated public participation, such as in the areas of environmental law, land use law, and anti-poverty programs. Such mandates have had outsized effects on communities suffering from economic disadvantage and racial and ethnic discrimination. This Article contends that we should examine these regulatory mandates in thinking about how to include the perspectives of marginalized stakeholders in the datafied society. The core takeaway is that meaningful public participation is extremely challenging and does not happen without intentional and inclusive design. At its best, public input can improve outputs and empower stakeholders. At its worst, it operates as a form of “window dressing,” in which marginalized communities have no real power to affect outcomes, thus generating distrust and alienation. Case studies show that meaningful public participation is most likely to result when there are hard-law requirements for public participation and when decision-makers operate transparently and recognize the value of the public’s expertise. In addition, impacted communities must be provided with capacity-building tools and resources to support their engagement. As legislative proposals to enhance tech accountability—through algorithmic impact assessments, audits, and other tools—gain steam, we must heed these lessons.

    Poverty and Communitarianism: Toward a Community-Based Welfare System


    Periods for Profit and the Rise of Menstrual Surveillance

    Menstruation is being monetized and surveilled, with the voluntary participation of millions of women. Thousands of downloadable apps promise to help women monitor their periods and manage their fertility. These apps are part of the broader, multi-billion-dollar Femtech industry, which sells technology to help women understand and improve their health. Femtech is marketed with the language of female autonomy and feminist empowerment. Despite this rhetoric, Femtech is part of a broader business strategy of data extraction, in which companies are extracting people’s personal data for profit, typically without their knowledge or meaningful consent. Femtech can oppress menstruators in several ways. Menstruators lose control over their personal data and how it is used. Some of these uses can potentially disadvantage women in the workplace, insurance markets, and credit scoring. In addition, these apps can force users into a gendered binary that does not always comport with their identity. Further, period trackers are sometimes inaccurate, leading to unwanted pregnancies. Additionally, the data is nearly impossible to erase, leading some women to be tracked relentlessly across the web with assumptions about their childbearing and fertility. Despite these harms, there are few legal restraints on menstrual surveillance. American data privacy law largely hinges on the concept of notice and consent, which puts the onus on people to protect their own privacy rather than placing responsibility on the entities that gather and use data. Yet notice and consent is a myth because consumers do not read, cannot comprehend, and have no opportunities to negotiate the terms of privacy policies. Notice and consent is an individualistic approach to data privacy that envisions an atomized person pursuing their own self-interest in a competitive marketplace. Menstruators’ needs do not fit this model. Accordingly, this Essay seeks to reconceptualize Femtech within an expanded menstrual justice framework that recognizes the tenets of data feminism. In this vision, Femtech would be an empowering and accurate health tool rather than a data extraction device.